# PubMed Pretraining

## BioMistral-7B-AWQ-QGS128-W4-GEMM

- Organization: BioMistral · License: Apache-2.0
- Tags: Large Language Model, Transformers, Multilingual
- Downloads: 224 · Likes: 5

BioMistral is an open-source suite of medical-domain models based on the Mistral architecture, further pretrained on open-access text from PubMed Central.

## BioGPT-Large

- Organization: microsoft · License: MIT
- Tags: Large Language Model, Transformers, English
- Downloads: 7,869 · Likes: 196

BioGPT is a domain-specific generative pretrained Transformer language model trained on large-scale biomedical literature, focused on biomedical text generation and mining.

## BioM-ALBERT-xxlarge

- Organization: sultan
- Tags: Large Language Model, Transformers
- Downloads: 77 · Likes: 2

A large-scale biomedical language model from a suite based on BERT, ALBERT, and ELECTRA, specialized for biomedical-domain tasks.

## BiomedNLP-BiomedBERT-base-uncased-abstract

- Organization: microsoft · License: MIT
- Tags: Large Language Model, English
- Downloads: 240.01k · Likes: 74

A biomedical domain-specific BERT model pretrained on PubMed article abstracts, achieving state-of-the-art performance on multiple biomedical NLP tasks.

## BioBERT-Large-Cased-v1.1-SQuAD

- Organization: dmis-lab
- Tags: Question Answering
- Downloads: 1,227 · Likes: 18

BioBERT is a BERT-based pretrained language model optimized for biomedical text mining, suitable for question-answering systems.
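For programmatic comparison, the listing above can be captured in a small catalog structure. This is a minimal sketch: the Hugging Face Hub IDs are assumptions inferred from the listed organization and model names, and the counts are the figures shown on this page.

```python
from dataclasses import dataclass

@dataclass
class ModelCard:
    hub_id: str      # assumed Hub ID, inferred from org + model name
    downloads: int
    likes: int

# Figures taken from the listing above; "240.01k" rendered as 240_010.
CATALOG = [
    ModelCard("BioMistral/BioMistral-7B-AWQ-QGS128-W4-GEMM", 224, 5),
    ModelCard("microsoft/BioGPT-Large", 7_869, 196),
    ModelCard("sultan/BioM-ALBERT-xxlarge", 77, 2),
    ModelCard("microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract", 240_010, 74),
    ModelCard("dmis-lab/biobert-large-cased-v1.1-squad", 1_227, 18),
]

def most_downloaded(catalog, n=3):
    """Return the Hub IDs of the n most-downloaded models."""
    ranked = sorted(catalog, key=lambda m: m.downloads, reverse=True)
    return [m.hub_id for m in ranked[:n]]

print(most_downloaded(CATALOG))
```

By these figures, the two BERT-style microsoft checkpoints lead the tag by downloads, ahead of BioBERT.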